Convergence of a Grassmannian Gradient Descent Algorithm for Subspace Estimation From Undersampled Data
Authors
Abstract
Subspace learning and matrix factorization problems have a great many applications in science and engineering, and efficient algorithms are critical as dataset sizes continue to grow. Many relevant problem formulations are non-convex, and in a variety of contexts it has been observed that solving the non-convex problem directly is not only efficient but reliably accurate. We discuss convergence theory for a particular method: first-order incremental gradient descent constrained to the Grassmannian. The output of the algorithm is an orthonormal basis for a d-dimensional subspace spanned by an input streaming data matrix. We study two sampling cases: each data vector of the streaming matrix is either fully sampled, or undersampled by a sampling matrix A_t ∈ R^{m×n} with m ≪ n. We propose an adaptive stepsize scheme that depends only on the sampled data and algorithm outputs. We prove that with fully sampled data, the stepsize scheme maximizes the improvement of our convergence metric at each iteration, and this method converges from any random initialization to the true subspace, despite the non-convex formulation and orthogonality constraints. For the case of undersampled data, we establish monotonic improvement of the defined convergence metric at each iteration with high probability.
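For concreteness, here is a minimal NumPy sketch of the fully sampled update in the spirit of a GROUSE-style rank-one geodesic step; the function name grouse_step, the numerical tolerance, and the demo at the end are our own illustration, not the paper's reference implementation. The eta=None branch implements the greedy intuition behind the adaptive stepsize: rotate the current span just enough that it contains the newly observed vector.

```python
import numpy as np

def grouse_step(U, v, eta=None):
    """One GROUSE-style incremental gradient step on the Grassmannian
    for a fully sampled vector v, where U is an (n, d) orthonormal basis.
    eta=None selects the greedy rotation that places v in the updated span."""
    w = U.T @ v                          # least-squares weights (U orthonormal)
    p = U @ w                            # projection of v onto span(U)
    r = v - p                            # residual, orthogonal to span(U)
    rn, pn, wn = np.linalg.norm(r), np.linalg.norm(p), np.linalg.norm(w)
    if rn < 1e-12 or wn < 1e-12:         # v already (numerically) in span(U)
        return U
    theta = np.arctan(rn / pn) if eta is None else rn * pn * eta
    # Rank-one geodesic step: rotates span(U) toward v, preserving orthonormality
    step = (np.cos(theta) - 1.0) * p / pn + np.sin(theta) * r / rn
    return U + np.outer(step, w / wn)

# Demo: recover a planted 5-dimensional subspace of R^100 from streaming vectors.
rng = np.random.default_rng(0)
n, d = 100, 5
U_true, _ = np.linalg.qr(rng.standard_normal((n, d)))
U_hat, _ = np.linalg.qr(rng.standard_normal((n, d)))   # random initialization
for _ in range(200):
    U_hat = grouse_step(U_hat, U_true @ rng.standard_normal(d))
print(np.linalg.norm(U_true - U_hat @ (U_hat.T @ U_true)))  # near zero at convergence
```

In exact arithmetic the geodesic step leaves U orthonormal; over long runs it is common to re-orthonormalize occasionally (e.g. with np.linalg.qr) to control floating-point drift.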
Similar articles
Global Convergence of a Grassmannian Gradient Descent Algorithm for Subspace Estimation
It has been observed in a variety of contexts that gradient descent methods have great success in solving low-rank matrix factorization problems, despite the relevant problem formulation being non-convex. We tackle a particular instance of this scenario, where we seek the d-dimensional subspace spanned by a streaming data matrix. We apply the natural first-order incremental gradient descent met...
Adaptive Stochastic Gradient Descent on the Grassmannian for Robust Low-Rank Subspace Recovery
In this paper, we present GASG21 (Grassmannian Adaptive Stochastic Gradient for L2,1 norm minimization), an adaptive stochastic gradient algorithm for robustly recovering the low-rank subspace of a large matrix. In the presence of column-outlier corruption, we reformulate the classical matrix L2,1 norm minimization problem as its stochastic programming counterpart. For each observed data vector,...
Online Supervised Subspace Tracking
We present a framework for supervised subspace tracking with two time series x_t and y_t, one being the high-dimensional predictors and the other the response variables, where the subspace tracking must take both sequences into consideration. It extends classic online subspace tracking, which can be viewed as tracking x_t only. Our online sufficient dimensionality r...
Enhanced Online Subspace Estimation via Adaptive Sensing
This work investigates the problem of adaptive measurement design for online subspace estimation from compressive linear measurements. We study the previously proposed Grassmannian rank-one online subspace estimation (GROUSE) algorithm with adaptively designed compressive measurements. We propose an adaptive measurement scheme that biases the measurement vectors towards the current subspace est...
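For intuition only, one hypothetical way to bias compressive measurement vectors toward a current subspace estimate is to mix directions drawn from span(U_hat) with isotropic random directions; the function biased_measurements and its mixing parameter beta below are our illustration, not the measurement scheme proposed in that paper.

```python
import numpy as np

def biased_measurements(U_hat, m, beta=0.5, seed=None):
    """Draw m unit-norm measurement vectors (rows of A_t) biased toward
    span(U_hat), where U_hat is an (n, d) orthonormal basis estimate.
    beta=0 gives purely random rows; beta=1 gives rows inside span(U_hat)."""
    rng = np.random.default_rng(seed)
    n, d = U_hat.shape
    in_span = (U_hat @ rng.standard_normal((d, m))).T   # directions in span(U_hat)
    isotropic = rng.standard_normal((m, n))             # unbiased directions
    A = beta * in_span + (1.0 - beta) * isotropic
    return A / np.linalg.norm(A, axis=1, keepdims=True) # unit-norm rows
```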
Local Convergence of an Algorithm for Subspace Identification from Partial Data
GROUSE (Grassmannian Rank-One Update Subspace Estimation) is an iterative algorithm for identifying a linear subspace of R^n from data consisting of partial observations of random vectors from that subspace. This paper examines local convergence properties of GROUSE, under assumptions on the randomness of the observed vectors, the randomness of the subset of elements observed at each iteration, ...
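A minimal sketch of the partially observed update this snippet refers to, assuming an index set omega of observed coordinates: the restricted least-squares solve and the zero-filled residual follow the standard GROUSE recipe, while the function name, tolerance, and fixed-stepsize handling are our own.

```python
import numpy as np

def grouse_partial_step(U, v_obs, omega, eta):
    """GROUSE-style update from a partially observed vector.
    U: (n, d) orthonormal basis; v_obs: observed entries; omega: their indices."""
    U_om = U[omega, :]
    # Weights from least squares restricted to the observed entries
    w, *_ = np.linalg.lstsq(U_om, v_obs, rcond=None)
    p = U @ w                            # predicted full vector in span(U)
    r = np.zeros(U.shape[0])
    r[omega] = v_obs - U_om @ w          # residual, zero-filled off omega
    rn, pn, wn = np.linalg.norm(r), np.linalg.norm(p), np.linalg.norm(w)
    if rn < 1e-12 or wn < 1e-12:         # observed part already well explained
        return U
    theta = rn * pn * eta                # sigma * eta, as in the full-data step
    step = (np.cos(theta) - 1.0) * p / pn + np.sin(theta) * r / rn
    return U + np.outer(step, w / wn)
```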
Journal: CoRR
Volume: abs/1610.00199
Pages: -
Publication date: 2016